- Mar|kov chain «MAHR kawf», Statistics. a succession of random events, the probability of each of which is determined by the event immediately preceding it:
»In its simplest form, a Markov chain states that the probability of a succeeding event occurring is dependent upon the fact that a preceding event occurred. For example, if the letter Q is known to exist, what is the probability of it being followed by the letter U?« (John P. Dowds).
[< Andrei Markov, 1856-1922, a Russian mathematician]
Useful English Dictionary. 2012.
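
The quoted Q-and-U example lends itself to a small demonstration. The following is a minimal Python sketch, not part of the dictionary entry, that estimates the conditional probability of one letter following another by counting adjacent letter pairs in a text; the sample string, the function name next_letter_probability, and the q/u pair are illustrative assumptions.

    # Minimal sketch: estimate P(next letter | current letter) from
    # bigram (adjacent letter pair) counts, in the spirit of the
    # Q-followed-by-U example quoted above. Sample text is invented.
    from collections import Counter

    def next_letter_probability(text, current, following):
        # Keep only letters, lowercased, so punctuation and spaces
        # do not break up adjacent pairs.
        letters = "".join(ch for ch in text.lower() if ch.isalpha())
        # Count every adjacent pair of letters in the text.
        pairs = Counter(zip(letters, letters[1:]))
        # Total number of pairs that start with the current letter.
        total = sum(n for (first, _), n in pairs.items() if first == current)
        if total == 0:
            return 0.0
        return pairs[(current, following)] / total

    sample = "quick quiet quilts require frequent quality checks"
    # Every q in this sample is followed by u, so this prints 1.0.
    print(next_letter_probability(sample, "q", "u"))

In a larger English corpus the estimate would be close to, but not exactly, 1.0, since rare words such as loanwords contain a q not followed by u.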